Error bound for Slope SVM in High Dimension

Author

  • Antoine Dedieu
Abstract

In this paper, we propose a new estimator, the Slope SVM, which minimizes the hinge loss with the Slope penalization introduced by [3]. We study the asymptotic behavior of the ℓ2 error between the theoretical hinge loss minimizer and the Slope estimator. We prove that Slope achieves a (k/n) log(p/k) rate with high probability and in expectation under the Weighted Restricted Eigenvalue Condition. This bound is similar to the exact minimax rate for regression and, to the best of our knowledge, it is the best achievable for a classification estimator.
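For concreteness, the Slope SVM objective combines the empirical hinge loss of a linear classifier with a sorted-ℓ1 (Slope) penalty, i.e. the penalty pairs the j-th largest coordinate of |w| with a decreasing weight lambda_j. The sketch below only illustrates that objective; the weight sequence lambda_j proportional to sqrt(log(2p/j)) is an assumed, illustrative calibration borrowed from the Slope literature, not necessarily the one used in the paper.

import numpy as np

def slope_penalty(w, lambdas):
    # Sorted-l1 (Slope) penalty: sum_j lambda_j * |w|_(j), where |w|_(1) >= |w|_(2) >= ...
    # are the coordinates of w sorted by decreasing absolute value and
    # lambda_1 >= ... >= lambda_p >= 0.
    abs_sorted = np.sort(np.abs(w))[::-1]
    return np.dot(lambdas, abs_sorted)

def slope_svm_objective(w, X, y, lambdas):
    # Empirical hinge loss of the linear classifier x -> sign(x @ w), plus the Slope penalty.
    # X: (n, p) design matrix, y: labels in {-1, +1}.
    margins = y * (X @ w)
    hinge = np.mean(np.maximum(0.0, 1.0 - margins))
    return hinge + slope_penalty(w, lambdas)

# Illustrative (assumed) weight sequence; the paper may use a different calibration.
n, p = 200, 1000
rng = np.random.default_rng(0)
X = rng.standard_normal((n, p))
y = np.sign(X[:, 0] + 0.1 * rng.standard_normal(n))
lambdas = 0.1 * np.sqrt(np.log(2 * p / np.arange(1, p + 1)))
print(slope_svm_objective(np.zeros(p), X, y, lambdas))  # equals 1.0 at w = 0

The Slope estimator is then a minimizer of this objective over w, and the abstract's result bounds the ℓ2 distance between that minimizer and the theoretical hinge loss minimizer at the order (k/n) log(p/k).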


Similar resources

PREDICTION OF SLOPE STABILITY STATE FOR CIRCULAR FAILURE: A HYBRID SUPPORT VECTOR MACHINE WITH HARMONY SEARCH ALGORITHM

Slope stability analysis is routinely performed by engineers to estimate the stability of river training works, road embankments, embankment dams, excavations and retaining walls. This paper presents a new approach to building a model for predicting the slope stability state. The support vector machine (SVM) is a machine learning method based on statistical learning theory, which can so...


An Error Bound for L1-norm Support Vector Machine Coefficients in Ultra-high Dimension

Compared with the standard L2-norm support vector machine (SVM), the L1-norm SVM enjoys the nice property of simultaneously performing classification and feature selection. In this paper, we investigate the statistical performance of the L1-norm SVM in ultra-high dimension, where the number of features p grows at an exponential rate of the sample size n. Different from existing theory for SVM whic...
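As an illustration of that simultaneous classification and feature-selection property, the following sketch fits an ℓ1-penalized linear SVM on toy high-dimensional data with scikit-learn and counts the surviving coefficients. Note that scikit-learn's ℓ1-penalized LinearSVC uses the squared hinge loss rather than the plain hinge loss typically analyzed in this line of work, so this is only a rough stand-in.

import numpy as np
from sklearn.svm import LinearSVC

# Toy high-dimensional data: only the first 5 of 500 features are informative.
rng = np.random.default_rng(0)
n, p = 100, 500
X = rng.standard_normal((n, p))
y = np.sign(X[:, :5].sum(axis=1))

# In scikit-learn, the l1 penalty requires the squared hinge loss and the primal solver.
clf = LinearSVC(penalty="l1", loss="squared_hinge", dual=False, C=0.1, max_iter=10000)
clf.fit(X, y)
print("selected features:", np.count_nonzero(clf.coef_))

With a small enough C (strong regularization), most coefficients are driven exactly to zero, which is the feature-selection behavior the abstract refers to.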


Assessment and comparing of support vector machines model and regression equations for predicting alluvial channel geometry

Determining the stable channel geometry of a river is one of the most important topics in river engineering. Various relationships (based on statistical and theoretical methods) for predicting stable channel dimensions have been proposed by many scientists. In this study, three Support Vector Machine (SVM) models are designed to predict the width (w), depth (h) and slope (s) of a stable channel. 85 cros...


Considering Span of Support Vector Bounds in the Context of Computational Learning Theory

This report describes a recent bound on expected support vector machine (SVM) generalization error [3] and frames this work in the context of computational learning theory [1], discussing the practical value of these bounds and the value to our mathematical understanding of machine learning. The fundamentals of computational learning theory are first outlined: PAC learning is defined, an exampl...


Optimization of the SVM Kernels Using an Empirical Error Minimization Scheme

We address the problem of optimizing kernel parameters in Support Vector Machine modelling, especially when the number of parameters is greater than one as in polynomial kernels and KMOD, our newly introduced kernel. The present work is an extended experimental study of the framework proposed by Chapelle et al. for optimizing SVM kernels using an analytic upper bound of the error. However, our ...



Publication year: 2017